State Noise Effects on the Stochastic Gradient Descent Optimization Method for Echo State Networks with Leaky Integrator Neurons

Authors

  • Anton Kirilov
  • Herbert Jaeger
Abstract

Echo state networks (ESNs) are a novel approach to modeling the nonlinear dynamical systems that abound in the sciences and engineering. They employ artificial recurrent neural networks in a way that has been independently proposed as a learning mechanism in biological brains, and this leads to a fast and simple algorithm for supervised training. ESNs are controlled by several global learning parameters that may be optimized with techniques such as stochastic gradient descent to improve the network's performance. This research is an empirical study of the stochastic gradient descent method that yields a better understanding of that optimization approach, its strengths, and its limitations. Such an understanding enables more efficient application of ESNs to all basic tasks of signal processing and control, including time series prediction, inverse modeling, pattern generation, event detection and classification, modeling distributions of stochastic processes, filtering, and nonlinear control.
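The leaky-integrator reservoir dynamics that the abstract alludes to can be sketched as follows. This is a minimal illustrative assumption, not code from the paper: the function name, the specific discretization (a common form with a single global leak rate `alpha`), and the toy weights are all hypothetical.

```python
import math

def leaky_esn_step(x, u, w_in, w, alpha):
    """One reservoir update for an ESN with leaky-integrator units.

    Implements a commonly used discretization (an assumption, not
    necessarily the exact form in the paper):
        x(n+1) = (1 - alpha) * x(n) + alpha * tanh(W_in * u(n+1) + W * x(n))
    where alpha in (0, 1] is a global leak rate -- one of the global
    parameters that stochastic gradient descent could tune.
    """
    n = len(x)
    new_x = []
    for i in range(n):
        # Pre-activation: input drive plus recurrent drive for unit i.
        pre = w_in[i] * u + sum(w[i][j] * x[j] for j in range(n))
        # Leaky update: blend the old state with the new activation.
        new_x.append((1.0 - alpha) * x[i] + alpha * math.tanh(pre))
    return new_x

# With alpha = 1 the update reduces to the standard (non-leaky) ESN
# equation; smaller alpha makes each unit integrate its input more
# slowly, matching slower time scales in the task.
state = [0.0, 0.0]
for u in [1.0, -0.5, 0.25]:
    state = leaky_esn_step(state, u,
                           w_in=[0.8, -0.3],
                           w=[[0.0, 0.4], [0.2, 0.0]],
                           alpha=0.3)
```

Because `alpha` is a scalar global parameter, a gradient-based optimizer such as SGD can adjust it (alongside other global parameters like spectral radius or input scaling) by following the gradient of the training error, which is the kind of optimization the study examines empirically.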


Related articles

Optimization and applications of echo state networks with leaky-integrator neurons

Standard echo state networks (ESNs) are built from simple additive units with a sigmoid activation function. Here we investigate ESNs whose reservoir units are leaky integrator units. Units of this type have individual state dynamics, which can be exploited in various ways to accommodate the network to the temporal characteristics of a learning task. We present stability conditions, introduce a...


The combination of circle topology and leaky integrator neurons remarkably improves the performance of echo state network on time series prediction

Recently, the echo state network (ESN) has attracted a great deal of attention due to its high accuracy and efficient learning performance. Compared with the traditional random structure and classical sigmoid units, a simple circle topology and leaky integrator neurons offer clear advantages for reservoir computing in ESNs. In this paper, we propose a new model of ESN with both circle reservoir structure...


Conjugate gradient neural network in prediction of clay behavior and parameters sensitivities

The use of artificial neural networks has increased in many areas of engineering. In particular, this method has been applied to many geotechnical engineering problems and has demonstrated some degree of success. A review of the literature reveals that it has been used successfully in modeling soil behavior, site characterization, earth retaining structures, settlement of structures, slope stabilit...


Time-Warping Invariant Echo State Networks

Echo State Networks (ESNs) are a recent, simple, and powerful approach to training recurrent neural networks (RNNs). In this report we present a modification of ESNs, time-warping invariant echo state networks (TWIESNs), that can effectively deal with time warping in dynamic pattern recognition. The standard approach to classifying time-warped input signals is to align them to candidate prototype patte...


Stability and Generalization of Learning Algorithms that Converge to Global Optima

We establish novel generalization bounds for learning algorithms that converge to global minima. We do so by deriving black-box stability results that only depend on the convergence of a learning algorithm and the geometry around the minimizers of the loss function. The results are shown for nonconvex loss functions satisfying the Polyak-Łojasiewicz (PL) and the quadratic growth (QG) conditions...




Publication date: 2007